Aggregation and Sparsity Via l1 Penalized Least Squares

Authors

  • Florentina Bunea
  • Alexandre B. Tsybakov
  • Marten H. Wegkamp
Abstract

This paper shows that near-optimal rates of aggregation and adaptation to unknown sparsity can be simultaneously achieved via ℓ1 penalized least squares in a nonparametric regression setting. The main tool is a novel oracle inequality on the sum of the empirical squared loss of the penalized least squares estimate and a term reflecting the sparsity of the unknown regression function.
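For orientation, the estimator in question is a weighted ℓ1-penalized least squares fit over a finite dictionary f_1, …, f_M; schematically (the paper's exact weights and constants differ):

```latex
\hat{\lambda} = \arg\min_{\lambda \in \mathbb{R}^M}
  \Bigg\{ \frac{1}{n}\sum_{i=1}^{n}\Big(Y_i - \sum_{j=1}^{M}\lambda_j f_j(X_i)\Big)^{2}
          + \sum_{j=1}^{M}\omega_{n,j}\,|\lambda_j| \Bigg\},
\qquad \hat{f} = \sum_{j=1}^{M}\hat{\lambda}_j f_j ,
```

with data-driven weights ω_{n,j} typically of order √(log M / n) up to norm factors. The oracle inequality then bounds the loss of the aggregate in terms of the best sparse linear combination in the dictionary.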


Similar articles

Sparse Density Estimation with l1 Penalties

This paper studies oracle properties of ℓ1-penalized estimators of a probability density. We show that the penalized least squares estimator satisfies sparsity oracle inequalities, i.e., bounds in terms of the number of non-zero components of the oracle vector. The results are valid even when the dimension of the model is (much) larger than the sample size. They are applied to estimation in spa...
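For context, ℓ1-penalized density estimators of this kind are commonly written as a penalized empirical risk over a dictionary f_1, …, f_M (a sketch of the general form, not the paper's exact statement):

```latex
\hat{\lambda} = \arg\min_{\lambda \in \mathbb{R}^M}
  \Bigg\{ \| f_\lambda \|_2^{2}
          - \frac{2}{n}\sum_{i=1}^{n} f_\lambda(X_i)
          + 2\sum_{j=1}^{M}\omega_j\,|\lambda_j| \Bigg\},
\qquad f_\lambda = \sum_{j=1}^{M}\lambda_j f_j ,
```

where the first two terms form an empirical surrogate for the L2 distance between f_λ and the true density (up to an additive constant), since the integral of f_λ against the density can be estimated by the sample average of f_λ(X_i).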


Reweighted l1-norm Penalized LMS for Sparse Channel Estimation and Its Analysis

A new reweighted l1-norm penalized least mean square (LMS) algorithm for sparse channel estimation is proposed and studied in this paper. Since the standard LMS algorithm does not take into account the sparsity information about the channel impulse response (CIR), sparsity-aware modifications of the LMS algorithm aim at outperforming the standard LMS by introducing a penalty term to the standard LM...
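As an illustration of the idea (not the paper's exact algorithm), a reweighted ℓ1 "zero attractor" can be added to the standard LMS update as in the following sketch; the toy data, step size and penalty constants here are assumptions for demonstration:

```python
import numpy as np

def rza_lms(x, d, num_taps, mu=0.05, rho=5e-4, eps=10.0):
    """Reweighted zero-attracting (reweighted l1-norm penalized) LMS -- generic sketch.

    x: input signal, d: desired signal, num_taps: adaptive filter length,
    mu: step size, rho: penalty strength, eps: reweighting constant.
    """
    w = np.zeros(num_taps)
    for n in range(num_taps - 1, len(x)):
        u = x[n - num_taps + 1 : n + 1][::-1]   # [x[n], x[n-1], ..., x[n-num_taps+1]]
        e = d[n] - w @ u                         # a priori estimation error
        # standard LMS step plus a reweighted l1 zero attractor:
        # coefficients near zero are shrunk strongly, large ones barely touched
        w = w + mu * e * u - rho * np.sign(w) / (1.0 + eps * np.abs(w))
    return w

# Toy usage (assumed setup): identify a sparse channel from noisy observations.
rng = np.random.default_rng(0)
h = np.zeros(16)
h[[2, 7, 11]] = [1.0, -0.5, 0.3]                 # sparse channel impulse response
x = rng.standard_normal(5000)
d = np.convolve(x, h)[: len(x)] + 0.01 * rng.standard_normal(len(x))
w_hat = rza_lms(x, d, num_taps=16)
print(np.round(w_hat, 2))
```

The attractor term pulls coefficients that are already close to zero much harder than large ones, which is what makes the reweighted penalty "sparsity-aware".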


Sparsity oracle inequalities for the Lasso

This paper studies oracle properties of ℓ1-penalized least squares in a nonparametric regression setting with random design. We show that the penalized least squares estimator satisfies sparsity oracle inequalities, i.e., bounds in terms of the number of non-zero components of the oracle vector. The results are valid even when the dimension of the model is (much) larger than the sample size and t...
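Schematically, a sparsity oracle inequality for an ℓ1-penalized estimator over a dictionary of size M takes a form like the following (constants, norms and remainder terms vary across results, including those of this paper):

```latex
\| \hat{f} - f \|^{2}
  \;\le\; C \,\inf_{\lambda \in \mathbb{R}^M}
  \Bigg\{ \| f_\lambda - f \|^{2} + c\,\frac{M(\lambda)\,\log M}{n} \Bigg\},
\qquad M(\lambda) = \#\{\, j : \lambda_j \neq 0 \,\},
```

so the estimator trades off approximation error against a price proportional to the number of non-zero coefficients of the oracle, even when M far exceeds the sample size n.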


Sparse channel estimation with lp-norm and reweighted l1-norm penalized least mean squares

The least mean squares (LMS) algorithm is one of the most popular recursive parameter estimation methods. In its standard form it does not take into account any special characteristics that the parameterized model may have. Assuming that such a model is sparse in some domain (for example, it has a sparse impulse or frequency response), we aim at developing LMS algorithms that can adapt to the ...
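The common template behind these variants is a stochastic gradient step on an instantaneous cost augmented with a sparsity penalty (a generic sketch; the specific ℓp and reweighted ℓ1 attractors studied in such papers include additional regularizing constants):

```latex
J_n(w) = \tfrac{1}{2}\,e_n^{2} + \rho\,P(w), \qquad e_n = d_n - w^{\top} x_n ,
\qquad
w_{n+1} = w_n + \mu\, e_n x_n - \mu\rho\,\nabla_w P(w_n),
```

For P(w) = ||w||_1 the attractor is sgn(w); the reweighted ℓ1 version uses sgn(w)/(1 + ε|w|); for P(w) = ||w||_p^p with 0 < p < 1 the gradient p|w|^{p-1} sgn(w) is regularized near zero in practice to avoid the singularity.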


Asymptotic distribution and sparsistency for l1 penalized parametric M-estimators, with applications to linear SVM and logistic regression

Since its early use in least squares regression problems, the l1-penalization framework for variable selection has been employed in conjunction with a wide range of loss functions encompassing regression, classification and survival analysis. While a well-developed theory exists for the l1-penalized least squares estimates, few results concern the behavior of l1-penalized estimates for general ...
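For a concrete, if simplified, instance of ℓ1-penalized M-estimation with non-quadratic losses, the following scikit-learn sketch fits ℓ1-penalized logistic regression and a linear SVM on synthetic data; the dataset, regularization strengths and solver choices are illustrative assumptions, not taken from the paper:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.svm import LinearSVC

# Toy data: only 5 of the 50 features carry signal (assumed setup).
X, y = make_classification(n_samples=500, n_features=50, n_informative=5,
                           n_redundant=0, random_state=0)

# l1-penalized logistic regression (C is the inverse regularization strength)
logit_l1 = LogisticRegression(penalty="l1", solver="liblinear", C=0.5).fit(X, y)

# l1-penalized linear SVM with squared hinge loss
svm_l1 = LinearSVC(penalty="l1", loss="squared_hinge", dual=False,
                   C=0.5, max_iter=5000).fit(X, y)

print("non-zero coefficients, logistic:", int(np.sum(logit_l1.coef_ != 0)))
print("non-zero coefficients, SVM:     ", int(np.sum(svm_l1.coef_ != 0)))
```

In both cases the ℓ1 penalty drives many coefficients exactly to zero, which is the variable-selection behavior that asymptotic-distribution and sparsistency results describe.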



Journal:

Volume   Issue

Pages  -

Publication date: 2006